Duration: 6-12+ Months
Location: Hybrid – Chicago, IL
We are seeking an experienced Data Architect to join a dynamic team dedicated to using data-driven decision-making and automation to drive organizational success. This is a strategic, hands-on role focused on designing and implementing enterprise-level data solutions.
The ideal candidate will have strong expertise in AWS (5+ years), Snowflake (3+ years), and API development, along with a proven track record of creating and managing robust data architectures.
This is a true architectural role rather than an engineering-focused one, requiring leadership in defining data flows, aggregation, curation, and governance while delivering scalable, secure data pipelines. The architect collaborates closely with data engineering, product, and data science teams to align architecture with business goals.
Key Responsibilities:
• Lead the creation of strategic enterprise data architectures to meet business and technical requirements.
• Partner with stakeholders to define principles, standards, and guidelines for data aggregation, migration, curation, modeling, consumption, and placement.
• Ensure adherence to data architecture standards, policies, and regulatory guidelines across initiatives.
• Design and validate data models for scalable, commercial-grade data pipelines.
• Provide expertise in end-to-end data pipeline architecture, including security reviews and deployment strategies.
• Prioritize and manage data intake requests, performing cost/benefit analysis to drive decision-making.
• Design data lakes, warehouses, and self-service reporting capabilities to enable insights for business leaders.
• Automate reporting and create robust data quality management frameworks.
Required Skills and Qualifications:
• 6+ years of experience in data engineering or related technical roles, including at least 4 years in data architecture.
• Expertise in AWS cloud solutions, Snowflake, and API development.
• Familiarity with BI tools such as Tableau, ThoughtSpot, Power BI, and Looker.
• Bachelor’s degree in a quantitative field (e.g., Computer Science, Engineering, Mathematics). Advanced degrees are a plus.
Preferred Skills:
• Proven ability to design data quality management tooling.
• Certifications in AWS or Snowflake are highly desirable.